12 research outputs found

    Particle Swarm Optimization Using Multiple Neighborhood Connectivity And Winner Take All Activation Applied To Biophysical Models Of Inferior Colliculus Neurons

    Age-related hearing loss is a prevalent neurological disorder, affecting as many as 63% of adults over the age of 70. The inability to hear and understand speech causes much distress in aged individuals and is becoming a major public health concern, as age-related hearing loss has also been correlated with other neurological disorders such as Alzheimer's dementia. The Inferior Colliculus (IC) is a major integrative auditory center, receiving excitatory and inhibitory inputs from several brainstem nuclei. This balance of excitation and inhibition gives rise to complex neural responses, which are measured in terms of firing rate as a given parameter is varied. A major obstacle in understanding the mechanisms that generate normal and aberrant auditory responses is estimating the strength and tuning of the excitatory and inhibitory inputs integrated to form the output firing of IC neurons. To better understand IC response generation, biophysically accurate, conductance-based computational models were used to recreate IC frequency tuning responses. The problem of fitting in vivo response curves was approached using particle swarm optimization, an optimization paradigm that mimics the social behavior of flocking birds. A new social network topology, modeled on the winner-take-all activation found in visual neural coding, was developed in which agents are divided into social hierarchies and compete for leadership rights. This social network has shown good performance on benchmark optimization problems and is used to recreate IC frequency tuning responses, which can in turn be used to further understand pathological aging in the auditory system.
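    The abstract describes particle swarm optimization with a tiered, winner-take-all social structure. The sketch below is a minimal illustration, not the authors' implementation: the tier assignment, parameter values, and the rule that each tier follows the single best particle ("winner") of the tier above are assumptions made for demonstration, applied to a standard sphere benchmark rather than an IC model fit.

    ```python
    import random

    def sphere(x):
        # Benchmark objective: global minimum of 0 at the origin.
        return sum(xi * xi for xi in x)

    def pso_wta(f, dim=2, n_particles=12, n_tiers=3, iters=200, seed=1):
        """Minimal PSO sketch with a winner-take-all twist: particles are
        split into tiers (social hierarchies), and each particle follows
        the single best member of the tier above its own (tier 0 follows
        its own winner). Details are illustrative assumptions."""
        rng = random.Random(seed)
        pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        tiers = [i % n_tiers for i in range(n_particles)]  # tier assignment
        w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
        for _ in range(iters):
            # Winner-take-all: the best personal best in each tier "wins"
            # exclusive leadership rights over the tier below.
            tier_winner = {}
            for i in range(n_particles):
                t = tiers[i]
                if t not in tier_winner or pbest_val[i] < pbest_val[tier_winner[t]]:
                    tier_winner[t] = i
            for i in range(n_particles):
                leader_tier = max(tiers[i] - 1, 0)
                g = pbest[tier_winner[leader_tier]]
                for d in range(dim):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * rng.random() * (g[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                v = f(pos[i])
                if v < pbest_val[i]:
                    pbest_val[i], pbest[i] = v, pos[i][:]
        best = min(range(n_particles), key=lambda i: pbest_val[i])
        return pbest[best], pbest_val[best]

    best_x, best_val = pso_wta(sphere)
    ```

    In a model-fitting setting, the objective would instead be the distance between a simulated IC frequency-tuning curve and the recorded one, with particle positions encoding input strengths and tunings.
    
    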

    Affective Image Sequence Viewing in Virtual Reality Theater Environment: Frontal Alpha Asymmetry Responses From Mobile EEG

    Background: Numerous studies have investigated emotion in virtual reality (VR) experiences using self-reported data in order to understand the valence and arousal dimensions of emotion. Objective physiological data concerning valence and arousal have been less explored. Electroencephalography (EEG) can be used to examine correlates of emotional responses such as valence and arousal in virtual reality environments. Used across varying fields of research, images are able to elicit a range of affective responses from viewers. In this study, we display image sequences with annotated valence and arousal values on a screen within a virtual reality theater environment. Understanding how brain activity responses relate to affective stimuli with known valence and arousal ratings may contribute to a better understanding of affective processing in virtual reality.
    Methods: We investigated frontal alpha asymmetry (FAA) responses to image sequences previously annotated with valence and arousal ratings. Twenty-four participants wore the Oculus Quest VR headset and viewed image sequences with known valence and arousal values while immersed in a virtual reality theater environment and their brain activity was recorded.
    Results: Image sequences with higher valence ratings elicited greater FAA scores than image sequences with lower valence ratings (F [1, 23] = 4.631, p = 0.042), while image sequences with higher arousal scores elicited lower FAA scores than image sequences with low arousal (F [1, 23] = 7.143, p = 0.014). The effect of valence on alpha power did not reach statistical significance (F [1, 23] = 4.170, p = 0.053). Only the high-valence, low-arousal image sequence elicited FAA significantly higher than FAA recorded during baseline (t [23] = −3.166, p = 0.002), suggesting that this image sequence was the most salient for participants.
    Conclusion: Image sequences with higher valence and lower arousal may lead to greater FAA responses in VR experiences. While the findings suggest that FAA data may be useful in understanding associations between self-reported valence and arousal and brain activity elicited by affective experiences in VR environments, additional research concerning individual differences in affective processing may inform the development of affective VR scenarios.
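    Frontal alpha asymmetry is conventionally computed as the difference of log-transformed alpha-band power between homologous right and left frontal electrodes (e.g. F4 and F3). The snippet below is a minimal sketch of that formula; the electrode pairing and the sample power values are illustrative assumptions, and the abstract does not specify the authors' exact pipeline.

    ```python
    import math

    def frontal_alpha_asymmetry(alpha_left, alpha_right):
        """FAA as commonly defined: ln(right alpha power) - ln(left alpha power),
        e.g. from electrodes F4 (right) and F3 (left). Because alpha power is
        inversely related to cortical activation, positive FAA indicates
        relatively greater left-frontal activity."""
        return math.log(alpha_right) - math.log(alpha_left)

    # Hypothetical alpha-band power values (in uV^2), for illustration only.
    faa = frontal_alpha_asymmetry(alpha_left=4.0, alpha_right=6.0)
    print(faa > 0)  # greater right-hemisphere alpha -> positive FAA
    ```

    A within-subject comparison like the one reported would then contrast FAA values averaged over each image sequence against the baseline recording.
    
    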

    Cue Reactivity in Active Pathological, Abstinent Pathological, and Regular Gamblers

    Twenty-one treatment-seeking pathological gamblers, 21 pathological gamblers in recovery, and 21 recreational gamblers watched two videotaped exciting gambling scenarios and an exciting roller-coaster control scenario while their arousal (heart rate and subjective excitement) and urge to gamble were measured. The groups did not differ significantly in cue-elicited heart rate elevations or excitement. However, the active pathological gamblers reported significantly greater urges to gamble across all cues compared to the abstinent pathological gamblers and, with marginal significance (p = 0.06), also compared to the recreational gamblers. Further exploration of these findings revealed that active pathological gamblers experience urges to gamble in response to exciting situations whether or not they are gambling related, whereas abstinent and recreational gamblers only report urges to an exciting gambling-related cue. This suggests that for pathological gamblers, excitement itself, irrespective of its source, may become a conditioned stimulus capable of triggering gambling behavior. Implications for treatment and future research are discussed.
